How Deep Learning is Being Made More Robust, More Domain-Aware, and More Capable of Generalization Through the Influence of Algebra and Topology
Henry Kvinge (Pacific Northwest National Lab)
Abstract: Driven by enormous amounts of data and compute, deep learning-based models continue to surpass yesterday’s benchmarks. As machine learning (ML) is applied in more and more domains, there is a constant need for new ways of looking at problems. Recent years have seen the rise of tools derived from topology and algebra, fields not traditionally associated with ML. In this talk I will begin by surveying some recent applications of these fields in ML, from hardcoding equivariance into vision models to using sheaves to better enable learning on graphs. I will argue that pure mathematics will increasingly offer critical tools for a more mature approach to machine learning. I will end by discussing some of my team’s recent work which, inspired by the notion of a fiber bundle, developed a novel deep learning architecture to solve a challenging problem in materials science.
Keywords: machine learning, mathematical physics, algebraic geometry, algebraic topology, number theory
Audience: researchers in the topic
DANGER2: Data, Numbers, and Geometry
Organizers: Alexander Kasprzyk* (contact for this listing), Thomas Oliver, Yang-Hui He
